
Conversation


@Wassasin Wassasin commented Jan 7, 2026

changelog: [large_futures]: changed lint such that it applies to all expressions of type Future.

Currently, large_futures only applies to `.await` expressions whose operand is a call to a function returning a future larger than a given threshold.

This means the lint does not fire if `await` is never called on the future. Examples include running `block_on` or similar functions, or passing the future to an executor. It is also possible to create futures that are not explicit functions, such as `async {}` blocks or types that implement `Future` directly.

This PR attempts to make the lint applicable to all expressions of type `Future`. It tries to find the 'deepest' expression in a branch of the expression tree for which this holds, and does not emit a message for any shallower expression in that branch.
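To illustrate the kind of case the old lint missed, here is a hypothetical sketch (not from the PR's test suite): a large future built from an `async {}` block that is never `.await`ed at the point of creation, yet still carries its full state-machine size.

```rust
// Hypothetical example: a future can be large without any visible
// `.await` on a function call, e.g. when built from an `async` block
// and later handed to an executor.
async fn big_fut(arg: [u8; 1024 * 16]) -> usize {
    arg.len()
}

fn main() {
    // An `async` block capturing a large array; no direct awaited call
    // at this point, so the old lint would not see it.
    let fut = async {
        let buf = [0u8; 1024 * 16];
        big_fut(buf).await
    };
    // The future's state machine must reserve space for the data it
    // holds across states, so it is at least as large as the array.
    assert!(std::mem::size_of_val(&fut) >= 1024 * 16);
    println!("future size: {} bytes", std::mem::size_of_val(&fut));
}
```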

Note that this PR currently fails testing, as the UI test toolkit seems to be very confused by the emitted spans: multiple messages are emitted for the same span, and the diagnostics do not seem to match at all. Any help with this is much appreciated. Example:

error: there were 2 unmatched diagnostics
  --> tests/ui/large_futures.rs:14:9
   |
14 |         big_fut([0u8; 1024 * 16]).await;
   |         ^^^^^^^^^^^^^^^^^^^^^^^^^
   |         |
   |         Error[clippy::large_futures]: usage of large future with a size of 16385 bytes
   |         Error[clippy::large_futures]: usage of large future with a size of 16385 bytes

Relevant code:

async fn wait() {
    let f = async {
        big_fut([0u8; 1024 * 16]).await;
        //~^ large_futures
    };
    f.await
    //~^ large_futures
}

@github-actions

github-actions bot commented Jan 7, 2026

No changes for 93216cf

@Wassasin Wassasin marked this pull request as ready for review January 7, 2026 15:22
@rustbot rustbot added the S-waiting-on-review Status: Awaiting review from the assignee but also interested parties label Jan 7, 2026
@rustbot
Collaborator

rustbot commented Jan 7, 2026

r? @dswij

rustbot has assigned @dswij.
They will have a look at your PR within the next two weeks and either review your PR or reassign to another reviewer.

Use r? to explicitly pick a reviewer

@rustbot
Collaborator

rustbot commented Jan 12, 2026

This PR was rebased onto a different master commit. Here's a range-diff highlighting what actually changed.

Rebasing is a normal part of keeping PRs up to date, so no action is needed—this note is just to help reviewers.

@ada4a
Contributor

ada4a commented Jan 12, 2026

Note that this PR currently fails testing as the UI toolkit seems to be very confused about the emitted spans. Multiple messages seem to be emitted for the same span, and the diagnostics do not seem to match at all. Any help concerning this is very appreciated.

Not sure this will fix your problem completely, but it looks like at least the annotations for async blocks are incorrect. You write them as follows:

async {
    let x = [0i32; 1024 * 16];
    async {}.await;
    dbg!(x);
}
//~^ large_futures

But the diagnostic is emitted at the first line of a multiline object, so it should look like this:

async {
    let x = [0i32; 1024 * 16];
    async {}.await;
    dbg!(x);
}
//~^^^^^ large_futures

(the number of repeated ^s denotes how many lines up the diagnostic is from the comment)

or like this:

//~v large_futures
async {
    let x = [0i32; 1024 * 16];
    async {}.await;
    dbg!(x);
}

@Wassasin
Author

Wassasin commented Jan 12, 2026

Not sure this will fix your problem completely, but it looks like at least the annotations for async blocks are incorrect. You write them as follows:

Thank you, that is indeed what I had not comprehended yet. The underlying issue is that I am (apparently) emitting duplicate lints for the same spans. When running clippy, these seem to be deduplicated; for the tests, however, this does not happen. I am unsure, though, why my visitors visit the same expressions multiple times.

What would be the proper way to fix this? Keep a BTreeSet of the spans around and deduplicate them myself?

@ada4a
Contributor

ada4a commented Jan 12, 2026

The underlying issue is that I am (apparently) emitting duplicate lints for the same spans. When running clippy, these seem to be deduplicated; for the tests, however, this does not happen.

Deduplication is disabled in tests, yes. The test suite did emit a note about this:
= note: duplicate diagnostic emitted due to `-Z deduplicate-diagnostics=no`

But it was drowned in the sea of other diagnostics, so I understand how it could be overlooked^^

I am unsure though why my visitors seem to visit the same expressions multiple times.

I think this is because LateLintPass::check_expr is called for each nested expression, so your visitors start at already seen nodes.

What would be the proper way to fix this? Keep a BTreeSet of the spans around and duplicate them myself?

I think HirIdSet would be a better fit, but otherwise yes. This might even make the lint run faster, as you don't rerun the whole visitor unnecessarily.
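The dedup pattern suggested above might look like the following sketch. Since rustc's `HirIdSet` and `HirId` are compiler internals, this generic stand-in uses std's `HashSet` over plain `u32` ids to show the same idea: record each visited node and lint it at most once, even when `check_expr` restarts the visitor at already-seen nested expressions.

```rust
use std::collections::HashSet;

// Hypothetical stand-in for the lint pass state: remember which nodes
// (plain u32 ids here, HirIds in the real pass) have already been
// linted, and skip repeats.
struct DedupLinter {
    seen: HashSet<u32>,
    emitted: Vec<u32>,
}

impl DedupLinter {
    fn new() -> Self {
        Self { seen: HashSet::new(), emitted: Vec::new() }
    }

    // Called once per node, possibly multiple times for the same node
    // (as `check_expr` is for each nested expression).
    fn check_node(&mut self, id: u32) {
        // `insert` returns false if the id was already present,
        // so each node is linted at most once.
        if self.seen.insert(id) {
            self.emitted.push(id);
        }
    }
}

fn main() {
    let mut linter = DedupLinter::new();
    // Simulate the visitor revisiting nested expressions: 2 appears twice.
    for id in [1, 2, 2, 3] {
        linter.check_node(id);
    }
    assert_eq!(linter.emitted, vec![1, 2, 3]);
    println!("emitted lints for nodes: {:?}", linter.emitted);
}
```

As noted, skipping already-seen nodes also avoids rerunning the whole visitor from every nested starting point.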

@ada4a
Contributor

ada4a commented Jan 12, 2026

^^ after rustfix is applied, all errors should be gone, but weren't

OK, this error message is a bit cryptic as well, I imagine...

The reason is that you only provide an autofix for async expressions, but not functions -- so after the autofixes are applied, the warnings on async functions are still emitted.

This is fine -- the way you handle this is as follows:

  1. add a separate test file, conventionally called large_futures_unfixable.rs
  2. move the async function test case to it
  3. annotate the file with //@ no-rustfix to avoid autofixes entirely
  • you could add a comment explaining why it's necessary, for future readers
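Following those steps, the header of the new test file might look like this (a sketch; the exact test case to move over is up to the author):

```rust
//@ no-rustfix: `async fn`s get no machine-applicable suggestion,
// so rustfix cannot remove their warnings.

// ... the `async fn` test case moved from large_futures.rs goes here ...
```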
